Consistency Theorems for Discrete Bayesian Learning

Author

  • Jan Poland
Abstract

Bayes’ rule specifies how to obtain a posterior from a class of hypotheses endowed with a prior and the observed data. There are three fundamental ways to use this posterior for predicting the future: marginalization (integration over the hypotheses w.r.t. the posterior), MAP (taking the a posteriori most probable hypothesis), and stochastic model selection (selecting a hypothesis at random according to the posterior distribution). If the hypothesis class is countable and contains the data generating distribution (this is termed the “realizable case”), strong consistency theorems are known for the former two methods, asserting almost sure convergence of the predictions to the truth as well as loss bounds. We prove corresponding results for stochastic model selection, for both discrete and continuous observation spaces. As a main technical tool, we will use the concept of a potential: this quantity, which is always positive, measures the total possible amount of future prediction errors. Precisely, in each time step, the expected potential decrease upper bounds the expected error. We introduce the entropy potential of a hypothesis class as its worst-case entropy with regard to the true distribution. Our results are proven within a general stochastic online prediction framework that comprises both online classification and prediction of non-i.i.d. sequences.
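The potential argument sketched in the abstract can be made concrete with a short worked inequality. The following is a hedged restatement in illustrative notation not taken from the paper (Φ_t for the potential after t observations, e_t for the prediction error at step t, h_{<t} for the observed history):

```latex
% Illustrative notation, not the paper's: \Phi_t \ge 0 is the potential after
% t observations, e_t the step-t prediction error, h_{<t} the history.
\[
  \mathbb{E}\bigl[e_t \mid h_{<t}\bigr]
    \;\le\; \Phi_t - \mathbb{E}\bigl[\Phi_{t+1} \mid h_{<t}\bigr],
  \qquad \Phi_t \ge 0 \ \text{for all } t.
\]
% Telescoping over t = 1, ..., T and letting T tend to infinity gives
\[
  \sum_{t=1}^{\infty} \mathbb{E}\bigl[e_t\bigr] \;\le\; \Phi_0 \;<\; \infty,
\]
% i.e. the cumulative expected error is bounded by the initial potential, so
% the per-step errors are summable and the predictions converge almost surely.
```

On this reading, the entropy potential introduced in the paper would supply the initial bound Φ_0 for the stochastic model selection results.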
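As a complement, the sketch below illustrates the three prediction methods from the abstract in the realizable case, assuming a finite truncation of a countable class of Bernoulli hypotheses. Everything here (the names thetas, prior, predict_*, and the choice of Bernoulli models) is an illustrative assumption, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# A finite truncation of a countable hypothesis class: each hypothesis is a
# Bernoulli distribution over the next bit, with a uniform prior.
thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])   # Bernoulli parameters
prior = np.full(len(thetas), 1.0 / len(thetas))

def bayes_update(weights, x):
    """Bayes' rule: reweight each hypothesis by its likelihood of the bit x."""
    lik = thetas if x == 1 else 1.0 - thetas
    w = weights * lik
    return w / w.sum()

def predict_marginal(weights):
    """Marginalization: mix all hypotheses' predictions w.r.t. the posterior."""
    return float(weights @ thetas)   # posterior probability that the next bit is 1

def predict_map(weights):
    """MAP: predict with the a posteriori most probable hypothesis."""
    return float(thetas[np.argmax(weights)])

def predict_stochastic(weights):
    """Stochastic model selection: sample one hypothesis from the posterior."""
    return float(thetas[rng.choice(len(thetas), p=weights)])

# Realizable case: the data-generating distribution is in the class.
true_theta = 0.7
w = prior.copy()
for _ in range(200):
    x = int(rng.random() < true_theta)   # observe one bit from the truth
    w = bayes_update(w, x)

print(predict_marginal(w), predict_map(w), predict_stochastic(w))
# After enough observations all three predictors should be close to 0.7.
```

In this toy setting the posterior concentrates on the true hypothesis, so MAP and stochastic model selection eventually agree with marginalization; the consistency theorems for the first two methods are known, and the paper's contribution is to prove corresponding results for stochastic model selection.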


Related Articles

Strong Limit Theorems for the Bayesian Scoring Criterion in Bayesian Networks

In the machine learning community, the Bayesian scoring criterion is widely used for model selection problems. One of the fundamental theoretical properties justifying its use is consistency. In this paper we refine this property for the case of binomial Bayesian network models. As a by-product of our derivations we establish strong consistency and obtain...


Consistency of Learning Bayesian Network Structures with Continuous Variables: An Information Theoretic Approach

We consider the problem of learning a Bayesian network structure from n examples and a prior, based on maximizing the posterior probability. We propose an algorithm that runs in O(n log n) time and that handles both continuous and discrete variables without assuming any class of distribution. We prove that the decision is strongly consistent, i.e., correct with probability ...


On Initial Data in the Problem of Consistency on Cubic Lattices for 3 × 3 Determinants

The paper is devoted to complete proofs of theorems on consistency on cubic lattices for 3 × 3 determinants. It considers the discrete nonlinear equations on Z defined by the condition that, for every set of lattice points forming an elementary 3 × 3 square, the 3 × 3 matrix of values of the scalar field at those points has vanishing determinant; some explicit concrete conditions of general position on initial...



Measure Transformer Semantics for Bayesian Machine Learning

The Bayesian approach to machine learning amounts to computing posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a ...




Publication year: 2006